Minimum relative entropy distributions with a large mean are Gaussian.
Author
Abstract
Entropy optimization principles are versatile tools with wide-ranging applications from statistical physics to engineering to ecology. Here we consider the following constrained problem: Given a prior probability distribution q, find the posterior distribution p minimizing the relative entropy (also known as the Kullback-Leibler divergence) with respect to q under the constraint that mean(p) is fixed and large. We show that solutions to this problem are approximately Gaussian. We discuss two applications of this result. In the context of dissipative dynamics, the equilibrium distribution of a Brownian particle confined in a strong external field is independent of the shape of the confining potential. We also derive an H-type theorem for evolutionary dynamics: The entropy of the (standardized) distribution of fitness of a population evolving under natural selection is eventually increasing in time.
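The constrained problem in the abstract has a classical form: the distribution minimizing KL divergence to a prior q under a fixed-mean constraint is an exponential tilt, p(x) ∝ q(x) e^{λx}, with λ chosen to hit the target mean. The following is a minimal numerical sketch of the Gaussianity claim; the prior q(x) ∝ exp(−x⁴) and the tilt strength λ = 40 are illustrative choices, not from the paper.

```python
import numpy as np

# Illustrative prior with faster-than-exponential tails: q(x) ∝ exp(-x^4).
x = np.linspace(-2.0, 8.0, 20001)
dx = x[1] - x[0]
log_q = -x**4

def tilted(lam):
    """Minimum-KL posterior under a mean constraint: p(x) ∝ q(x) exp(lam*x)."""
    logw = log_q + lam * x
    logw -= logw.max()            # subtract max for numerical stability
    w = np.exp(logw)
    return w / (w.sum() * dx)     # normalize on the grid

lam = 40.0                        # strong tilt -> large mean
p = tilted(lam)
mean = np.sum(x * p) * dx
var = np.sum((x - mean) ** 2 * p) * dx

# Gaussian with the same mean and variance, and the total-variation gap to p.
gauss = np.exp(-(x - mean) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
tv = 0.5 * np.sum(np.abs(p - gauss)) * dx
print(f"mean={mean:.3f}, var={var:.4f}, TV distance={tv:.4f}")
```

The small total-variation distance reflects a Laplace-type argument: the tilted log-density is sharply peaked where λ cancels the slope of log q, and the quadratic term dominates there, so the posterior is approximately Gaussian around its mode.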
Similar articles
State distributions and minimum relative entropy noise sequences in uncertain stochastic systems: the discrete time case
The paper is concerned with a dissipativity theory and robust performance analysis of discrete-time stochastic systems driven by a statistically uncertain random noise. The uncertainty is quantified by the conditional relative entropy of the actual probability law of the noise with respect to a nominal product measure corresponding to a white noise sequence. We discuss a balance equation, dissi...
Projective Power Entropy and Maximum Tsallis Entropy Distributions
We discuss a one-parameter family of generalized cross entropy between two distributions with the power index, called the projective power entropy. The cross entropy is essentially reduced to the Tsallis entropy if two distributions are taken to be equal. Statistical and probabilistic properties associated with the projective power entropy are extensively investigated including a characterizati...
Entropy-constrained scalar quantization and minimum entropy with error bound by discrete wavelet transforms in image compression
The global maximum of an entropy function with different decision levels for a three-level scalar quantizer performed after a discrete wavelet transform was derived. Herein, we considered the case of entropy-constrained scalar quantization capable of avoiding many compression ratio reductions as the mean squared error was minimized. We also dealt with the problem of minimum entropy with an erro...
Bayesian Estimation of Shift Point in Shape Parameter of Inverse Gaussian Distribution Under Different Loss Functions
In this paper, a Bayesian approach is proposed for shift-point detection in an inverse Gaussian distribution. The mean parameter of the inverse Gaussian distribution is assumed to be constant, and a shift point in the shape parameter is considered. First, the posterior distribution of the shape parameter is obtained. Then the Bayes estimators are derived under a class of priors and using variou...
Minimum and Maximum Entropy Distributions for Binary Systems with Known Means and Pairwise Correlations
Maximum entropy models are increasingly being used to describe the collective activity of neural populations with measured mean neural activities and pairwise correlations, but the full space of probability distributions consistent with these constraints has not been explored. We provide upper and lower bounds on the entropy for the minimum entropy distribution over arbitrarily large collection...
Journal:
- Physical Review E
Volume: 94, Issue: 6-1
Pages: -
Publication date: 2016